Simple and Effective Complementary Label Learning Based on Mean Square Error Loss
Authors
Abstract
In this paper, we propose a simple and effective complementary label learning approach to address the label noise problem for deep models. Various surrogate losses have been proposed for complementary label learning; however, they are often sophisticatedly designed, since they are required to satisfy the classifier consistency property. We propose a mean square error loss under both the unbiased and the biased assumptions. We also show theoretically that our method guarantees that the optimal classifier learned from complementary labels coincides with the one learned from ordinary labels. Finally, we test our method on three different benchmark datasets under both assumptions to verify its effectiveness.
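The abstract does not spell out the loss itself, so the following is only a minimal sketch of one plausible instantiation under the uniform ("unbiased") assumption common in complementary label learning: a mean square error between the softmax output and a target vector that puts zero mass on the given complementary label and spreads the remainder uniformly over the other classes. The function name `complementary_mse_loss` and all details below are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch only: an MSE-style loss for complementary labels under the
# uniform ("unbiased") assumption. This is NOT the authors' released implementation.
import torch
import torch.nn.functional as F

def complementary_mse_loss(logits: torch.Tensor, comp_labels: torch.Tensor) -> torch.Tensor:
    """logits: (batch, num_classes); comp_labels: (batch,) classes the samples do NOT belong to."""
    num_classes = logits.size(1)
    probs = F.softmax(logits, dim=1)
    # Target: zero probability on the complementary class, uniform mass 1/(K-1) elsewhere.
    target = torch.full_like(probs, 1.0 / (num_classes - 1))
    target.scatter_(1, comp_labels.unsqueeze(1), 0.0)
    # Per-sample squared error between predicted and target distributions, averaged over the batch.
    return ((probs - target) ** 2).sum(dim=1).mean()

# Minimal usage example with random data.
if __name__ == "__main__":
    torch.manual_seed(0)
    logits = torch.randn(8, 10, requires_grad=True)
    comp_labels = torch.randint(0, 10, (8,))
    loss = complementary_mse_loss(logits, comp_labels)
    loss.backward()
    print(float(loss))
```

A biased (non-uniform) assumption would replace the uniform 1/(K-1) target with class-dependent transition weights; the sketch above covers only the uniform case.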
Similar resources
Backpropagation learning algorithms for classification with fuzzy mean square error
Most real-life classification problems have ill-defined, imprecise, or fuzzy class boundaries. Feedforward neural networks with the conventional backpropagation learning algorithm are not tailored to this kind of classification problem. Hence, in this paper, feedforward neural networks that use a backpropagation learning algorithm with fuzzy objective functions are investigated. A learning al...
Optimum thresholding using mean and conditional mean square error
We consider a univariate semimartingale model for (the logarithm of) an asset price, containing jumps having possibly infinite activity (IA). The nonparametric threshold estimator $\widehat{IV}_n$ of the integrated variance $IV := \int_0^T \sigma_s^2\,ds$ proposed in [6] is constructed using observations on a discrete time grid, and precisely it sums up the squared increments of the process when they are under a thres...
Structural Content Laplacian Mean Square Error
Measurement of the quality of image compression is important for image processing applications. In this paper, we propose an objective image quality assessment to measure the quality of gray-scale compressed images, which correlates well with subjective quality measurement (MOS) and takes little time to compute. The new objective image quality measurement is developed from a few fundamentals of objective ...
Robust Learning with Kernel Mean p-Power Error Loss
Correntropy is a second-order statistical measure in kernel space, which has been successfully applied in robust learning and signal processing. In this paper, we define a non-second-order statistical measure in kernel space, called the kernel mean p-power error (KMPE), which includes the correntropic loss (C-Loss) as a special case. Some basic properties of KMPE are presented. In particular, we appl...
On the mean square error of randomized averaging algorithms
This paper regards randomized discrete-time consensus systems that preserve the average on expectation. As a main result, we provide an upper bound on the mean square deviation of the consensus value from the initial average. Then, we particularize our result to systems where the interactions which take place simultaneously are few, or weakly correlated; these assumptions cover several algorith...
Journal
Journal title: Journal of Physics
Year: 2023
ISSN: 0022-3700, 1747-3721, 0368-3508, 1747-3713
DOI: https://doi.org/10.1088/1742-6596/2504/1/012016